Supplementary Material: Online Kernel Learning with a Near Optimal Sparsity Bound

Authors

  • Lijun Zhang
  • Jinfeng Yi
  • Rong Jin
Abstract

A. Proof of Theorem 2

We prove a lower bound on the number of support vectors required to achieve the optimal regret bound. First, we construct a set of $n$ examples $T_1 = \{(x_i, y_i)\}_{i=1}^{n}$, where $\langle \kappa(x_i, \cdot), \kappa(x_j, \cdot) \rangle_{\mathcal{H}_\kappa} = \delta_{ij}$ and $y_i \in \{1, -1\}$. To make the construction, consider the degree-$d$ polynomial kernel $\kappa(x, y) = (x^\top y)^d$ and a Euclidean space $\mathbb{R}^m$ with $m > n$. Since $m > n$, we can find a set of orthonormal vectors $\{x_1, \ldots, x_n\}$ in $\mathbb{R}^m$ such that $x_i^\top x_j = 0$ when $i \neq j$ and $x_i^\top x_i = 1$. It is easy to verify that this construction satisfies our assumption $\kappa(x_i, x_j) = \delta_{ij}$. For the Gaussian kernel, when the distance between $x_i$ and $x_j$ is large enough, we also have $\kappa(x_i, x_j) \approx \delta_{ij}$.

Based on $T_1$, we construct another set $T_2$: $(z, u) \in T_2$ if there exist an index $j \in [n]$ and a function $\xi \in \mathcal{H}_\kappa$ such that

$$\kappa(z, \cdot) = \kappa(x_j, \cdot) + \xi, \qquad u = y_j, \qquad \text{and} \qquad \langle \xi, \kappa(x_i, \cdot) \rangle_{\mathcal{H}_\kappa} = 0, \ \forall i \in [n]. \tag{13}$$

Thus, for each $(z, u) \in T_2$ there is a corresponding $(x_j, y_j) \in T_1$ such that the relationships in (13) hold. The existence of $T_2$ can be proved in a similar way to that of $T_1$. Second, we select $T$ distinct training examples $(z_1, u_1), \ldots, (z_T, u_T)$ from $T_2$ such that, for each $(x, y) \in T_1$, there are $T/n$ examples constructed from it. Take the logit loss $\ell(y, z) = \ln(1 + \exp(-yz))$ as an example. From the above constructions, it is easy to check that …
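
As a sanity check on this construction, the short numpy sketch below (our own illustration, not the paper's code; the variable names and the separation scale are assumptions) verifies that orthonormal vectors yield $\kappa(x_i, x_j) = \delta_{ij}$ under the degree-$d$ polynomial kernel, and that the Gaussian kernel Gram matrix approaches the identity once the points are far enough apart. It also defines the logit loss used in the argument above.

import numpy as np

n, m, d = 5, 8, 3            # n examples embedded in R^m (m > n), degree-d kernel

# Orthonormal vectors x_1, ..., x_n in R^m: take the first n rows of a
# random orthogonal matrix obtained from a QR decomposition.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((m, m)))
X = Q[:n]                    # rows satisfy X[i] @ X[j] = delta_ij

# Degree-d polynomial kernel: kappa(x, y) = (x^T y)^d.
poly_gram = (X @ X.T) ** d
assert np.allclose(poly_gram, np.eye(n), atol=1e-10)  # kappa(x_i, x_j) = delta_ij

# Gaussian kernel: kappa(x, y) = exp(-||x - y||^2 / (2 sigma^2)).
# Scaling X up enlarges all pairwise distances, which is the "distance
# large enough" condition invoked in the text; the Gram matrix is then
# approximately the identity.
sigma = 1.0
Xs = 10.0 * X
sq_dists = np.sum((Xs[:, None, :] - Xs[None, :, :]) ** 2, axis=-1)
gauss_gram = np.exp(-sq_dists / (2 * sigma ** 2))
assert np.allclose(gauss_gram, np.eye(n), atol=1e-6)  # kappa(x_i, x_j) ~ delta_ij

# Logit loss used in the lower-bound argument: l(y, z) = ln(1 + exp(-y z)).
def logit_loss(y, z):
    return np.log1p(np.exp(-y * z))

print(logit_loss(1.0, 0.0))  # ln 2 ~ 0.6931 at the zero prediction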

Similar Articles

Online Kernel Learning with a Near Optimal Sparsity Bound

In this work, we focus on Online Sparse Kernel Learning, which aims to learn, in an online fashion, a kernel classifier with a bounded number of support vectors. Although many online learning algorithms have been proposed to learn a sparse kernel classifier, most of them fail to bound the number of support vectors used by the final solution, which is the average of the intermediate kernel classifiers generated by...
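
Why averaging inflates the support set can be seen in a small sketch. The code below is our illustration, not the paper's algorithm: it runs plain online gradient descent with the logit loss, where every round contributes a nonzero coefficient, so the averaged classifier ends up with as many support vectors as there are rounds.

import numpy as np

rng = np.random.default_rng(1)
T, eta = 100, 0.5
X = rng.standard_normal((T, 2))
y = np.sign(X[:, 0])

def kappa(a, b):                                  # Gaussian kernel
    return np.exp(-np.sum((a - b) ** 2) / 2.0)

alpha = np.zeros(T)      # current classifier: f = sum_i alpha[i] * kappa(x_i, .)
avg_alpha = np.zeros(T)  # running sum of the intermediate classifiers
for t in range(T):
    f_t = sum(alpha[i] * kappa(X[i], X[t]) for i in range(t))
    grad = -y[t] / (1.0 + np.exp(y[t] * f_t))     # d/df ln(1 + exp(-y f)), never 0
    alpha[t] -= eta * grad                        # OGD step adds kappa(x_t, .)
    avg_alpha += alpha
avg_alpha /= T

print("support vectors of the averaged solution:", np.count_nonzero(avg_alpha))
# prints T: every round contributes a nonzero coefficient to the average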

Sparsity in machine learning: theory and practice

The thesis explores sparse machine learning algorithms for supervised (classification and regression) and unsupervised (subspace methods) learning. For classification, we review the set covering machine (SCM) and propose new algorithms that directly minimise the SCM's sample compression generalisation error bounds during the training phase. Two of the resulting algorithms are proved to pr...

Approximation Vector Machines for Large-scale Online Learning

One of the most challenging problems in kernel online learning is to bound the model size and to promote model sparsity. Sparse models not only improve computation and memory usage, but also enhance the generalization capacity – a principle that concurs with the law of parsimony. However, inappropriate sparsity modeling may also significantly degrade the performance. In this paper, we propose A...

Structured Sparsity and Generalization

We present a data dependent generalization bound for a large class of regularized algorithms which implement structured sparsity constraints. The bound can be applied to standard squared-norm regularization, the Lasso, the group Lasso, some versions of the group Lasso with overlapping groups, multiple kernel learning and other regularization schemes. In all these cases competitive results are o...

Adaptive Control of a Class of Nonlinear Discrete-Time Systems with Online Kernel Learning

An Online Kernel Learning based Adaptive Control (OKL-AC) framework for discrete-time affine nonlinear systems is presented in this paper. A sparsity strategy is proposed to control the complexity of the OKL identification model while making a trade-off between the demanded tracking precision and the complexity of the control law. The forward increasing and backward decreasing learning stages...

Publication date: 2013